An ℓ1-oracle inequality for the Lasso in finite mixture Gaussian regression models
Authors
Abstract
Related articles
An Overview of the New Feature Selection Methods in Finite Mixture of Regression Models
Variable (feature) selection has attracted much attention in contemporary statistical learning and recent scientific research. This is mainly due to the rapid advancement in modern technology that allows scientists to collect data of unprecedented size and complexity. One type of statistical problem in such applications is concerned with modeling an output variable as a function of a sma...
Oracle Inequality for Instrumental Variable Regression
where φ is the parameter of interest which models the relationship while U is an error term. Contrary to usual statistical regression models, the error term is correlated with the explanatory variables X, hence E(U | X) ≠ 0, preventing direct estimation of φ. To overcome the endogeneity of X, we assume that there exists an observed random variable W, called the instrument, which decorrelates t...
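To illustrate the instrumental-variable setup sketched in this abstract, the following minimal example runs a two-stage least squares estimate on simulated data. The linear specification, the simulated variables, and the use of NumPy are assumptions of this sketch, not the cited paper's procedure.

```python
# Minimal two-stage least squares (2SLS) sketch for a linear IV model
# y = phi * x + u, with x endogenous (correlated with u) and w an instrument.
# The linear form of phi and all simulated quantities are assumptions here.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

w = rng.normal(size=n)                        # instrument, independent of the error
u = rng.normal(size=n)                        # structural error
x = 0.8 * w + 0.5 * u + rng.normal(size=n)    # endogenous regressor: E(u | x) != 0
y = 2.0 * x + u                               # outcome, true coefficient 2.0

# Stage 1: regress x on w to isolate the exogenous variation in x.
W = np.column_stack([np.ones(n), w])
x_hat = W @ np.linalg.lstsq(W, x, rcond=None)[0]

# Stage 2: regress y on the fitted values x_hat.
X_hat = np.column_stack([np.ones(n), x_hat])
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]

ols = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]
print("naive OLS slope:", ols[1])   # biased upward by the endogeneity
print("2SLS slope:    ", beta[1])   # close to the true value 2.0
```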
Imputation through finite Gaussian mixture models
Imputation is a widely used method for handling missing data. It consists in the replacement of missing values with plausible ones. Parametric and nonparametric techniques are generally adopted for modelling incomplete data. Both of them have advantages and drawbacks. Parametric techniques are parsimonious but depend on the model assumed, while nonparametric techniques are more flexible but req...
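As a simplified illustration of imputation with a finite Gaussian mixture, the sketch below fits a two-component mixture on complete cases and replaces a missing coordinate by its conditional expectation under the fitted model. The two-dimensional data, the missingness pattern, and the use of scikit-learn are assumptions of the example, not the cited paper's method.

```python
# Sketch: conditional-mean imputation of one coordinate under a fitted Gaussian mixture.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n = 1000
z = rng.integers(0, 2, size=n)                       # latent component labels
means = np.array([[0.0, 0.0], [4.0, 3.0]])
X = means[z] + rng.normal(scale=1.0, size=(n, 2))
miss = rng.random(n) < 0.2                           # treat column 1 as missing for 20% of rows

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(X[~miss])                                    # fit on complete cases only

def impute_col1(x0):
    """E[X1 | X0 = x0] under the fitted mixture."""
    cond_means = np.zeros(gmm.n_components)
    resp = np.zeros(gmm.n_components)
    for k in range(gmm.n_components):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        resp[k] = gmm.weights_[k] * norm.pdf(x0, mu[0], np.sqrt(S[0, 0]))
        cond_means[k] = mu[1] + S[1, 0] / S[0, 0] * (x0 - mu[0])
    resp /= resp.sum()
    return resp @ cond_means

X_imp = X.copy()
X_imp[miss, 1] = [impute_col1(x0) for x0 in X[miss, 0]]
print("RMSE of imputed values:",
      np.sqrt(np.mean((X_imp[miss, 1] - X[miss, 1]) ** 2)))
```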
Gibbs sampling for fitting finite and infinite Gaussian mixture models
This document gives a high-level summary of the necessary details for implementing collapsed Gibbs sampling for fitting Gaussian mixture models (GMMs) following a Bayesian approach. The document structure is as follows. After notation and reference sections (Sections 2 and 3), the case for sampling the parameters of a finite Gaussian mixture model is described in Section 4. This is then extende...
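For intuition, here is a stripped-down collapsed Gibbs sampler for a one-dimensional finite mixture with known, shared component variance and a conjugate Normal prior on the means. These simplifications (and the simulated data) are assumptions of the sketch; the document summarized above covers the general case and the infinite-mixture extension.

```python
# Toy collapsed Gibbs sampler for a 1-D finite Gaussian mixture.
# Assumed for brevity: known shared variance sigma2, Normal(mu0, tau0sq) prior on
# component means (integrated out), symmetric Dirichlet(alpha/K) prior on weights
# (integrated out), so only the assignments z are sampled.
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 150)])
n, K = len(x), 2
alpha, mu0, tau0sq, sigma2 = 1.0, 0.0, 10.0, 1.0

z = rng.integers(0, K, size=n)                       # initial assignments
counts = np.bincount(z, minlength=K).astype(float)
sums = np.array([x[z == k].sum() for k in range(K)])

def log_predictive(xi, n_k, s_k):
    """log density of xi under the posterior predictive of component k (mean integrated out)."""
    prec = 1.0 / tau0sq + n_k / sigma2
    mu_n = (mu0 / tau0sq + s_k / sigma2) / prec
    var = 1.0 / prec + sigma2
    return -0.5 * (np.log(2 * np.pi * var) + (xi - mu_n) ** 2 / var)

for sweep in range(200):
    for i in range(n):
        k_old = z[i]
        counts[k_old] -= 1; sums[k_old] -= x[i]      # remove x[i] from its cluster
        logp = np.array([np.log(counts[k] + alpha / K) +
                         log_predictive(x[i], counts[k], sums[k]) for k in range(K)])
        p = np.exp(logp - logp.max()); p /= p.sum()
        k_new = rng.choice(K, p=p)                   # resample the assignment of x[i]
        z[i] = k_new
        counts[k_new] += 1; sums[k_new] += x[i]

print("final cluster sizes:", counts)
print("posterior means of the component means:",
      [(mu0 / tau0sq + sums[k] / sigma2) / (1 / tau0sq + counts[k] / sigma2) for k in range(K)])
```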
Lasso Methods for Gaussian Instrumental Variables Models
In this note, we propose to use sparse methods (e.g. LASSO, Post-LASSO, √LASSO, and Post-√LASSO) to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments, p, in the canonical Gaussian case. The methods apply even when p is much larger than the sample size, n. We derive asymptotic distributions for the resulting IV estim...
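A rough sketch of the idea of forming first-stage predictions with a Lasso and using them as instruments is shown below. The simulated design, the use of cross-validated LassoCV, and the simple one-regressor IV formula are assumptions of this sketch, not the exact estimators or tuning studied in the note.

```python
# Sketch: Lasso-based first stage with many instruments, then a simple IV estimate.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n, p = 200, 300                               # many instruments, p > n
Z = rng.normal(size=(n, p))
pi = np.zeros(p); pi[:5] = 1.0                # only 5 instruments are relevant
u = rng.normal(size=n)                        # structural error
d = Z @ pi + 0.7 * u + rng.normal(size=n)     # endogenous regressor
y = 1.5 * d + u                               # outcome, true coefficient 1.5

# First stage: Lasso regression of d on the instruments to form d_hat.
first_stage = LassoCV(cv=5).fit(Z, d)
d_hat = first_stage.predict(Z)

# Second stage: use d_hat as the instrument for d in a one-regressor IV estimate.
beta_iv = (d_hat @ y) / (d_hat @ d)
beta_ols = (d @ y) / (d @ d)                  # naive OLS, biased by endogeneity

print("instruments selected by the Lasso:", int(np.sum(first_stage.coef_ != 0)))
print("OLS estimate:", beta_ols, " IV estimate with Lasso first stage:", beta_iv)
```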
Journal
Journal title: ESAIM: Probability and Statistics
Year: 2013
ISSN: 1292-8100, 1262-3318
DOI: 10.1051/ps/2012016